Global Average Land Temperature Time Series forecasting with a simple Gated Recurrent Units (GRU) Neural Network architecture, using TensorFlow, Keras and Talos.


Time Series forecasting is one of Machine Learning challenges, especially when it comes to weather parameters, as these a subject to multiple processes influence that may hinder a better learning of these sequences of values. In this piece of work we are going to illustrate the power Deep Learning in tackling a typical problem, viz., forecasting the global average land temperature using a 2-layer GRU. The dataset contains monthly temperatures records ranging from 1750, for the longest time series, to 2015.
Just like performing a Grid-/Random-Search with Scikit-Learn, a set of hyperparameters will be optimized using Talos.
In [1]:
##################################################################################################################
##*********                    Moukouba Moutoumounkata, July 2020              ***************                  ##
##                Global Land Average Temperature Time Series Forecasting                                       ##
##                *******************************************************                                       ##
##################################################################################################################



#we will try to build a reproductible experience, although without 
#guarantee, due to the extremely random nature of Neural Networks
from numpy.random import seed
seed(1)
from tensorflow import set_random_seed
set_random_seed(2)

import numpy as np
import pandas as pd
from operator import itemgetter
import matplotlib.pyplot as plt
import chart_studio.plotly as py
import plotly.express as px
import plotly.graph_objs as go
import seaborn as sns

from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Importing the Keras libraries and packages from TensorFlow
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Input, Dense, Dropout, LSTM, Flatten, 
                                     GRU, Bidirectional, TimeDistributed)
from tensorflow.keras.activations import relu, linear
from tensorflow.keras.optimizers import SGD, Adam, RMSprop
from tensorflow.keras.models import model_from_json
from tensorflow.keras import backend as K
import tensorflow as tf

import talos
from talos import scan, Evaluate, Reporting 
from talos.utils import early_stopper, lr_normalizer
from talos.utils.gpu_utils import parallel_gpu_jobs
from talos.utils.recover_best_model import recover_best_model

import warnings
warnings.filterwarnings("ignore")

%matplotlib inline

df = pd.read_csv("/home/moukouba/data_science/python/datasets/globaltemperatures.csv")
Using TensorFlow backend.
In [2]:
df.head()
Out[2]:
dt landaveragetemperature landaveragetemperatureuncertainty landmaxtemperature landmaxtemperatureuncertainty landmintemperature landmintemperatureuncertainty landandoceanaveragetemperature landandoceanaveragetemperatureuncertainty
0 1750-01-01 3.034 3.574 NaN NaN NaN NaN NaN NaN
1 1750-02-01 3.083 3.702 NaN NaN NaN NaN NaN NaN
2 1750-03-01 5.626 3.076 NaN NaN NaN NaN NaN NaN
3 1750-04-01 8.490 2.451 NaN NaN NaN NaN NaN NaN
4 1750-05-01 11.573 2.072 NaN NaN NaN NaN NaN NaN
In [3]:
df.shape
Out[3]:
(3192, 9)
In [4]:
df["dt"] = df.dt.apply(pd.to_datetime, errors='coerce')

df1 = df[["dt", "landaveragetemperature", "landaveragetemperatureuncertainty"]].set_index("dt")
df2 = df[["dt", "landmintemperature", "landmintemperatureuncertainty"]].set_index("dt")

print(df1.isnull().sum())
print(df2.isnull().sum())
df1.head(15)
landaveragetemperature               12
landaveragetemperatureuncertainty    12
dtype: int64
landmintemperature               1200
landmintemperatureuncertainty    1200
dtype: int64
Out[4]:
landaveragetemperature landaveragetemperatureuncertainty
dt
1750-01-01 3.034 3.574
1750-02-01 3.083 3.702
1750-03-01 5.626 3.076
1750-04-01 8.490 2.451
1750-05-01 11.573 2.072
1750-06-01 12.937 1.724
1750-07-01 15.868 1.911
1750-08-01 14.750 2.231
1750-09-01 11.413 2.637
1750-10-01 6.367 2.668
1750-11-01 NaN NaN
1750-12-01 2.772 2.970
1751-01-01 2.495 3.469
1751-02-01 0.963 3.827
1751-03-01 5.800 3.051
In [5]:
df1.shape
Out[5]:
(3192, 2)
In [6]:
df2 = df2.dropna()
df2.head()
Out[6]:
landmintemperature landmintemperatureuncertainty
dt
1850-01-01 -3.206 2.822
1850-02-01 -2.291 1.623
1850-03-01 -1.905 1.410
1850-04-01 1.018 1.329
1850-05-01 3.811 1.347
In [ ]:
 
As can be seen, the sequence has a few missing values (#12). These will be imputed using the Long Term Mean (LTM) of specific day, but not the LTM of all the series, so as not to affect the range of values that specific day may take.
In [7]:
#We define a function that replaces the missing entries with the long term mean of that specific day of the year
def imputer(df):
    #The actual dates with missing values within all the dataset
    dates_nan = pd.Series([index for index, row in df.iterrows() if row.isnull().any()])
    
    #The specific days of the year with some missing values (without duplicates)
    days_nan = dates_nan.dt.strftime('%m-%d').drop_duplicates()
    
    #The Sets of specific days with missing values
    set_with_missing = [[index for index in df.index if day in str(index)] for day in days_nan]
    
    #Now, we can replace the missings with, say, the mean/mode of each set
    for missing in dates_nan:
        for labels in set_with_missing:
            if missing in labels:
                df.loc[missing,] = df.loc[labels,].mean() 
    return df
In [ ]:
 
In [8]:
df1 = imputer(df1)

df1.head(15)
Out[8]:
landaveragetemperature landaveragetemperatureuncertainty
dt
1750-01-01 3.034000 3.574000
1750-02-01 3.083000 3.702000
1750-03-01 5.626000 3.076000
1750-04-01 8.490000 2.451000
1750-05-01 11.573000 2.072000
1750-06-01 12.937000 1.724000
1750-07-01 15.868000 1.911000
1750-08-01 14.750000 2.231000
1750-09-01 11.413000 2.637000
1750-10-01 6.367000 2.668000
1750-11-01 5.701539 0.961708
1750-12-01 2.772000 2.970000
1751-01-01 2.495000 3.469000
1751-02-01 0.963000 3.827000
1751-03-01 5.800000 3.051000
In [ ]:
 
Although the main focus of the work is to try and forecast the Land Average Temperature (LAT) only, it has been well worthwhile to pinpoint how global temperatures (average and minimum) have varied over the years, especially since 1900, in the context of Climate Change. That is, a succinct analysis of the Land Minimum Temperature (LMT) has been done accordingly. This has been done through graphical analysis of the series line plots.
In [9]:
#x1, x2, x3, x4, x5 = itemgetter(0, 1, 2, 3, 4)(np.array_split(X, 5))
x = np.array_split(df1, 3)

for data in x:
    fig = px.line(data["landaveragetemperature"])
    fig.show()
In [10]:
fig = px.line(df1["landaveragetemperatureuncertainty"])
fig.show()
In [11]:
#z1, z2, z3, z4, z5 = itemgetter(0, 1, 2, 3, 4)(np.array_split(Z, 5))

z = np.array_split(df2, 3)

for data in z:
    fig = px.line(data["landmintemperature"])
    fig.show()
In [13]:
#Subsetting one month, say July, and plot the series' curve
df3 = df1[df1.index.month == 7]
df4 = df2[df2.index.month == 7]

fig = px.line(df3["landaveragetemperature"])
fig.show()
In [14]:
fig = px.line(df4["landmintemperature"])
fig.show()
Carefully looking at the above graphics, we can say that both the LAT and the LMT are indeed rising! It can be noticed that, up to 1930, the LAT was fluctuating about $14°C$; but from 1930 on, the LAT has gone well above this value, exceeding $15°C$ from 1995 onwards. Furthermore, the LMT is also on the rise, with values below $-3°C$ up to 1954; and from this year, the LMT has risen above $-3°C$ and has generally gone above $-2°C$.

Finally, the two last graphics above, which exemplify temperature variations during the month of July, conspicuously show that global warming is a reality, as temperatures, especially the LMT, have been continuously rising since 1900.

Worthy of note, since the World did not have a dense network of observation stations until the last century, these values have been mostly estimated through Re-analysis, and the uncertainty curve shows that the farther we go back in time, the larger the uncertainty in estimation, with values approaching zero in recent decades. Therefore, values up to the end of 1800s should be considered with a grain of salt, as they tend to be inconsistent with the above observations.

Data Preparation

GRUs are Recurrent Neural Network models, and expect three-dimensional input with the shape [samples, timesteps, features]. The data will, thus, be reshaped accordingly. But first, we check for the consistency of the data; then through a certain number of transformations before feeding it to the model.
In [12]:
#Plotting boxplots to check for outliers

trace0 = go.Box(y=df2["landmintemperature"], name="Min_temp")
trace1 = go.Box(y=df1["landaveragetemperature"], name="Avg_temp")

sequences = [trace0, trace1]
fig = go.Figure(sequences)
fig.show()
The above boxplots may be missleading, as they show that the data is free from outliers. But a careful understanding of the variability of temperature with respect to months/seasons is enough to consider each month individually; for example, considering the month of February, we can see that it's not exempt from outliers, as shown in the graphic below. But, as stipulated above, we want to investigate the power of Deep Learing, so will we leave the algorithm to learn and make forecasts.
In [15]:
#Boxplots for the specific month of July
trace0 = go.Box(y=df4["landmintemperature"], name="Min_temp")
trace1 = go.Box(y=df3["landaveragetemperature"], name="Avg_temp")

sequences = [trace0, trace1]
fig = go.Figure(sequences)
fig.show()
In [14]:
#We define a function that transforms the data to the required shape
def convert2matrix(data_arr, n_steps):
    X, y = [], []
    for i in range(len(data_arr) - n_steps):
        d = i + n_steps  
        X.append(data_arr[i:d, ])
        y.append(data_arr[d, ])
    return np.array(X), np.array(y)

#We define a function that splits the data into train, validation and test sets
def data_splitter(df, train_fraction):
    train_size = int(len(df)*train_fraction)
    train, test = df[0:train_size, ], df[train_size:len(df), ]
    return train, test
In [15]:
#Split the data into train, validation and test series
series = df1["landaveragetemperature"].values 

train, test = data_splitter(series, 0.85)
train, val = data_splitter(series, 0.75)
print(len(train), len(val), len(test))
2394 798 479
In [16]:
#Convert dataset into right shape to turn the problem into a supervised... 
#... learning one. We choose n_steps = 36, thus, we use we look back up to... 
#... 3 years (36) consecutive months to be able to forecast the next month

n_steps = 36

X_train, y_train = convert2matrix(train, n_steps)
X_val, y_val = convert2matrix(val, n_steps)
X_test, y_test = convert2matrix(test, n_steps)

#Scale the data
b_scaled = X_train.copy()
b_scaled_val = X_val.copy()
b_scaled_test = X_test.copy()

scaler = MinMaxScaler(feature_range=(0, 1))
x_train = scaler.fit_transform(b_scaled)
x_val = scaler.transform(b_scaled_val)
x_test = scaler.transform(b_scaled_test)

# reshape input to be [samples, time steps, features]
x_train = np.reshape(x_train, (x_train.shape[0], 1, x_train.shape[1]))
x_val = np.reshape(x_val, (x_val.shape[0], 1, x_val.shape[1]))
x_test = np.reshape(x_test, (x_test.shape[0], 1, x_test.shape[1]))
In [17]:
b_scaled_test.shape
Out[17]:
(443, 36)
In [18]:
for i in range(5):
    print(b_scaled_test[i], y_test[i])
[ 3.031  4.517  8.294 10.942 13.086 14.155 13.511 11.895  8.511  5.66
  3.681  2.492  3.471  5.702  8.85  11.78  13.876 14.631 14.09  11.862
  9.156  6.544  3.749  2.705  3.456  5.607  8.791 11.414 13.22  14.364
 13.297 12.03   9.339  6.35   3.74   2.679] 2.841
[ 4.517  8.294 10.942 13.086 14.155 13.511 11.895  8.511  5.66   3.681
  2.492  3.471  5.702  8.85  11.78  13.876 14.631 14.09  11.862  9.156
  6.544  3.749  2.705  3.456  5.607  8.791 11.414 13.22  14.364 13.297
 12.03   9.339  6.35   3.74   2.679  2.841] 5.474
[ 8.294 10.942 13.086 14.155 13.511 11.895  8.511  5.66   3.681  2.492
  3.471  5.702  8.85  11.78  13.876 14.631 14.09  11.862  9.156  6.544
  3.749  2.705  3.456  5.607  8.791 11.414 13.22  14.364 13.297 12.03
  9.339  6.35   3.74   2.679  2.841  5.474] 8.455
[10.942 13.086 14.155 13.511 11.895  8.511  5.66   3.681  2.492  3.471
  5.702  8.85  11.78  13.876 14.631 14.09  11.862  9.156  6.544  3.749
  2.705  3.456  5.607  8.791 11.414 13.22  14.364 13.297 12.03   9.339
  6.35   3.74   2.679  2.841  5.474  8.455] 11.199000000000002
[13.086 14.155 13.511 11.895  8.511  5.66   3.681  2.492  3.471  5.702
  8.85  11.78  13.876 14.631 14.09  11.862  9.156  6.544  3.749  2.705
  3.456  5.607  8.791 11.414 13.22  14.364 13.297 12.03   9.339  6.35
  3.74   2.679  2.841  5.474  8.455 11.199] 13.487
Now, we build a basic GRU neural network, with two hidden layers and one dropout-regularization layer. Two metrics (the Mean Absolute Error and the Coefficient of Determination (r_squared)), will be used.
In [33]:
# Setting the dictionary of the hyperparameter to be included in the optimisation process 
p = {'epochs': (20, 150, 5),
     'neurons1': [ 32, 64, 128, 256],
     'neurons2': [32, 64, 128, 256],
     'dropout': (0.1, 0.6, 5),
     'loss': ['mse', 'mae'],
     'activation1':[relu, None,],
     'activation2':[linear, None,],
     'batch_size': [8, 16, 32, 64, 128],
     'optimizer': ['Adam', 'SGD', 'RMSprop']
     }
In [34]:
#Define custom coefficient of determination metric
def r_squared(y_true, y_pred):
    SS_res =  K.sum(K.square(y_true-y_pred)) 
    SS_tot = K.sum(K.square(y_true - K.mean(y_true))) 
    return (1 - SS_res/(SS_tot + K.epsilon()))


#We define a model builder function. We wrap the first hidden layer into a Bidirectional layer 
def model_builder(x_train, y_train, x_val, y_val, params, n_steps = 36, n_features = 1):
    
    tf.keras.backend.clear_session()
    
    model = Sequential([
        Bidirectional(GRU(params['neurons1'], return_sequences=True, 
                          activation=params['activation1'], input_shape=(n_features, n_steps))),
        GRU(params['neurons2'], activation=params['activation1']),
        Dropout(params['dropout']),
        
        Dense(1, activation=params['activation2'])        
    ])
    
    model.compile(optimizer=params['optimizer'], loss=params['loss'], metrics=['mae', r_squared])
    
    history = model.fit(x_train, y_train, epochs=params['epochs'], batch_size=params['batch_size'],  verbose=0, 
                        validation_data=[x_val, y_val], callbacks=[early_stopper(epochs=params['epochs'], 
                                                                                 mode='moderate', monitor='val_loss')])
    
    return history, model
Next, we run the experiments creating a Scan object (scrutinizer) and split the GPU memory in two for two parallel jobs; and for the sake of computational burden, only 1% of the hyperparameter space, randomly downsampled, was used - that is, 480 rounds - and the process took 1 hour and 34 minutes to execute, on a modest GTX-1050 GPU.
In [35]:
parallel_gpu_jobs(0.5)

scrutinizer = talos.Scan(x=x_train, y=y_train, x_val=x_test, y_val=y_test, seed=42, 
                         model=model_builder, experiment_name='time_series__gru_hpo_1', 
                         params=p,fraction_limit=0.01, reduction_metric='val_loss')
  0%|          | 0/480 [00:00<?, ?it/s]
  0%|          | 1/480 [00:07<1:00:04,  7.53s/it]
  0%|          | 2/480 [00:13<56:49,  7.13s/it]  
  1%|          | 3/480 [00:31<1:21:13, 10.22s/it]
  1%|          | 4/480 [00:53<1:49:22, 13.79s/it]
  1%|          | 5/480 [01:01<1:35:51, 12.11s/it]
  1%|▏         | 6/480 [01:04<1:15:15,  9.53s/it]
  1%|▏         | 7/480 [01:21<1:31:19, 11.58s/it]
  2%|▏         | 8/480 [01:34<1:34:20, 11.99s/it]
  2%|▏         | 9/480 [02:23<3:01:30, 23.12s/it]
  2%|▏         | 10/480 [02:31<2:24:55, 18.50s/it]
  2%|▏         | 11/480 [02:38<1:59:20, 15.27s/it]
  2%|▎         | 12/480 [02:49<1:47:34, 13.79s/it]
  3%|▎         | 13/480 [03:02<1:46:27, 13.68s/it]
  3%|▎         | 14/480 [03:31<2:21:03, 18.16s/it]
  3%|▎         | 15/480 [03:34<1:47:13, 13.84s/it]
  3%|▎         | 16/480 [04:05<2:24:58, 18.75s/it]
  4%|▎         | 17/480 [04:16<2:06:26, 16.38s/it]
  4%|▍         | 18/480 [04:21<1:40:49, 13.09s/it]
  4%|▍         | 19/480 [04:27<1:25:15, 11.10s/it]
  4%|▍         | 20/480 [04:39<1:25:17, 11.12s/it]
  4%|▍         | 21/480 [05:00<1:47:43, 14.08s/it]
  5%|▍         | 22/480 [05:30<2:24:39, 18.95s/it]
  5%|▍         | 23/480 [05:46<2:17:40, 18.07s/it]
  5%|▌         | 24/480 [05:51<1:48:09, 14.23s/it]
  5%|▌         | 25/480 [05:58<1:30:29, 11.93s/it]
  5%|▌         | 26/480 [06:10<1:31:03, 12.04s/it]
  6%|▌         | 27/480 [06:40<2:11:56, 17.48s/it]
  6%|▌         | 28/480 [07:41<3:49:25, 30.45s/it]
  6%|▌         | 29/480 [08:02<3:27:25, 27.60s/it]
  6%|▋         | 30/480 [08:09<2:39:56, 21.33s/it]
  6%|▋         | 31/480 [08:16<2:09:10, 17.26s/it]
  7%|▋         | 32/480 [08:25<1:50:44, 14.83s/it]
  7%|▋         | 33/480 [08:34<1:35:29, 12.82s/it]
  7%|▋         | 34/480 [08:39<1:17:44, 10.46s/it]
  7%|▋         | 35/480 [08:54<1:27:54, 11.85s/it]
  8%|▊         | 36/480 [09:00<1:15:41, 10.23s/it]
  8%|▊         | 37/480 [09:07<1:07:18,  9.12s/it]
  8%|▊         | 38/480 [09:23<1:22:14, 11.16s/it]
  8%|▊         | 39/480 [10:04<2:28:52, 20.26s/it]
  8%|▊         | 40/480 [10:12<2:01:16, 16.54s/it]
  9%|▊         | 41/480 [10:38<2:22:56, 19.54s/it]
  9%|▉         | 42/480 [10:43<1:49:14, 14.97s/it]
  9%|▉         | 43/480 [10:57<1:47:49, 14.81s/it]
  9%|▉         | 44/480 [11:13<1:49:50, 15.12s/it]
  9%|▉         | 45/480 [11:27<1:46:50, 14.74s/it]
 10%|▉         | 46/480 [11:40<1:42:34, 14.18s/it]
 10%|▉         | 47/480 [11:50<1:34:28, 13.09s/it]
 10%|█         | 48/480 [11:55<1:15:16, 10.46s/it]
 10%|█         | 49/480 [12:04<1:12:23, 10.08s/it]
 10%|█         | 50/480 [12:25<1:35:31, 13.33s/it]
 11%|█         | 51/480 [12:35<1:29:48, 12.56s/it]
 11%|█         | 52/480 [12:48<1:29:32, 12.55s/it]
 11%|█         | 53/480 [12:53<1:14:02, 10.40s/it]
 11%|█▏        | 54/480 [13:12<1:31:41, 12.91s/it]
 11%|█▏        | 55/480 [13:36<1:54:25, 16.15s/it]
 12%|█▏        | 56/480 [13:44<1:37:39, 13.82s/it]
 12%|█▏        | 57/480 [13:54<1:27:53, 12.47s/it]
 12%|█▏        | 58/480 [13:58<1:10:55, 10.08s/it]
 12%|█▏        | 59/480 [14:05<1:04:42,  9.22s/it]
 12%|█▎        | 60/480 [14:12<58:57,  8.42s/it]  
 13%|█▎        | 61/480 [14:18<53:17,  7.63s/it]
 13%|█▎        | 62/480 [14:24<51:03,  7.33s/it]
 13%|█▎        | 63/480 [14:33<53:55,  7.76s/it]
 13%|█▎        | 64/480 [14:37<45:29,  6.56s/it]
 14%|█▎        | 65/480 [14:59<1:17:51, 11.26s/it]
 14%|█▍        | 66/480 [15:15<1:27:11, 12.64s/it]
 14%|█▍        | 67/480 [15:45<2:02:35, 17.81s/it]
 14%|█▍        | 68/480 [15:58<1:53:21, 16.51s/it]
 14%|█▍        | 69/480 [16:04<1:30:57, 13.28s/it]
 15%|█▍        | 70/480 [16:09<1:13:31, 10.76s/it]
 15%|█▍        | 71/480 [16:14<1:02:02,  9.10s/it]
 15%|█▌        | 72/480 [16:39<1:33:29, 13.75s/it]
 15%|█▌        | 73/480 [17:09<2:07:59, 18.87s/it]
 15%|█▌        | 74/480 [17:15<1:39:34, 14.72s/it]
 16%|█▌        | 75/480 [17:29<1:38:46, 14.63s/it]
 16%|█▌        | 76/480 [17:42<1:36:12, 14.29s/it]
 16%|█▌        | 77/480 [17:47<1:15:30, 11.24s/it]
 16%|█▋        | 78/480 [17:56<1:11:10, 10.62s/it]
 16%|█▋        | 79/480 [18:02<1:02:42,  9.38s/it]
 17%|█▋        | 80/480 [18:37<1:53:26, 17.02s/it]
 17%|█▋        | 81/480 [19:22<2:48:32, 25.35s/it]
 17%|█▋        | 82/480 [19:33<2:19:39, 21.05s/it]
 17%|█▋        | 83/480 [20:27<3:24:25, 30.89s/it]
 18%|█▊        | 84/480 [20:45<2:58:52, 27.10s/it]
 18%|█▊        | 85/480 [20:49<2:13:23, 20.26s/it]
 18%|█▊        | 86/480 [20:56<1:46:51, 16.27s/it]
 18%|█▊        | 87/480 [21:15<1:51:09, 16.97s/it]
 18%|█▊        | 88/480 [21:19<1:26:11, 13.19s/it]
 19%|█▊        | 89/480 [21:32<1:25:20, 13.10s/it]
 19%|█▉        | 90/480 [21:44<1:21:50, 12.59s/it]
 19%|█▉        | 91/480 [21:48<1:05:32, 10.11s/it]
 19%|█▉        | 92/480 [21:59<1:06:47, 10.33s/it]
 19%|█▉        | 93/480 [22:05<58:24,  9.06s/it]  
 20%|█▉        | 94/480 [22:10<51:05,  7.94s/it]
 20%|█▉        | 95/480 [22:20<54:46,  8.54s/it]
 20%|██        | 96/480 [22:28<53:46,  8.40s/it]
 20%|██        | 97/480 [22:40<1:00:30,  9.48s/it]
 20%|██        | 98/480 [22:44<49:42,  7.81s/it]  
 21%|██        | 99/480 [22:52<50:38,  7.97s/it]
 21%|██        | 100/480 [23:04<56:57,  8.99s/it]
 21%|██        | 101/480 [23:09<49:14,  7.80s/it]
 21%|██▏       | 102/480 [23:18<51:09,  8.12s/it]
 21%|██▏       | 103/480 [23:25<49:15,  7.84s/it]
 22%|██▏       | 104/480 [23:52<1:25:20, 13.62s/it]
 22%|██▏       | 105/480 [23:59<1:13:40, 11.79s/it]
 22%|██▏       | 106/480 [24:07<1:05:04, 10.44s/it]
 22%|██▏       | 107/480 [24:11<54:00,  8.69s/it]  
 22%|██▎       | 108/480 [24:28<1:09:22, 11.19s/it]
 23%|██▎       | 109/480 [24:55<1:37:05, 15.70s/it]
 23%|██▎       | 110/480 [25:03<1:22:40, 13.41s/it]
 23%|██▎       | 111/480 [25:07<1:05:45, 10.69s/it]
 23%|██▎       | 112/480 [25:44<1:54:53, 18.73s/it]
 24%|██▎       | 113/480 [25:49<1:28:13, 14.42s/it]
 24%|██▍       | 114/480 [26:03<1:27:53, 14.41s/it]
 24%|██▍       | 115/480 [26:33<1:55:30, 18.99s/it]
 24%|██▍       | 116/480 [26:39<1:31:54, 15.15s/it]
 24%|██▍       | 117/480 [26:47<1:18:06, 12.91s/it]
 25%|██▍       | 118/480 [27:16<1:47:42, 17.85s/it]
 25%|██▍       | 119/480 [27:20<1:22:51, 13.77s/it]
 25%|██▌       | 120/480 [27:36<1:25:18, 14.22s/it]
 25%|██▌       | 121/480 [27:41<1:09:57, 11.69s/it]
 25%|██▌       | 122/480 [27:47<58:02,  9.73s/it]  
 26%|██▌       | 123/480 [27:56<57:10,  9.61s/it]
 26%|██▌       | 124/480 [28:02<50:09,  8.45s/it]
 26%|██▌       | 125/480 [28:06<42:31,  7.19s/it]
 26%|██▋       | 126/480 [28:12<39:57,  6.77s/it]
 26%|██▋       | 127/480 [28:20<42:12,  7.17s/it]
 27%|██▋       | 128/480 [28:27<41:12,  7.02s/it]
 27%|██▋       | 129/480 [28:35<43:17,  7.40s/it]
 27%|██▋       | 130/480 [28:38<36:00,  6.17s/it]
 27%|██▋       | 131/480 [28:49<43:22,  7.46s/it]
 28%|██▊       | 132/480 [29:02<53:05,  9.15s/it]
 28%|██▊       | 133/480 [29:17<1:03:45, 11.02s/it]
 28%|██▊       | 134/480 [29:21<50:54,  8.83s/it]  
 28%|██▊       | 135/480 [29:27<46:50,  8.15s/it]
 28%|██▊       | 136/480 [29:32<40:06,  7.00s/it]
 29%|██▊       | 137/480 [29:37<37:22,  6.54s/it]
 29%|██▉       | 138/480 [29:52<52:06,  9.14s/it]
 29%|██▉       | 139/480 [29:59<48:15,  8.49s/it]
 29%|██▉       | 140/480 [30:07<47:20,  8.36s/it]
 29%|██▉       | 141/480 [30:22<58:43, 10.39s/it]
 30%|██▉       | 142/480 [30:36<1:03:41, 11.31s/it]
 30%|██▉       | 143/480 [30:41<53:13,  9.48s/it]  
 30%|███       | 144/480 [30:48<49:17,  8.80s/it]
 30%|███       | 145/480 [30:59<51:27,  9.22s/it]
 30%|███       | 146/480 [31:02<41:49,  7.51s/it]
 31%|███       | 147/480 [31:06<36:09,  6.52s/it]
 31%|███       | 148/480 [31:27<59:07, 10.68s/it]
 31%|███       | 149/480 [31:41<1:05:24, 11.86s/it]
 31%|███▏      | 150/480 [31:54<1:06:22, 12.07s/it]
 31%|███▏      | 151/480 [32:10<1:13:30, 13.41s/it]
 32%|███▏      | 152/480 [32:16<1:00:13, 11.02s/it]
 32%|███▏      | 153/480 [32:20<48:51,  8.96s/it]  
 32%|███▏      | 154/480 [32:39<1:05:49, 12.11s/it]
 32%|███▏      | 155/480 [32:54<1:09:57, 12.91s/it]
 32%|███▎      | 156/480 [33:00<57:36, 10.67s/it]  
 33%|███▎      | 157/480 [33:04<46:55,  8.72s/it]
 33%|███▎      | 158/480 [33:16<52:22,  9.76s/it]
 33%|███▎      | 159/480 [33:21<43:49,  8.19s/it]
 33%|███▎      | 160/480 [33:30<44:54,  8.42s/it]
 34%|███▎      | 161/480 [33:40<48:39,  9.15s/it]
 34%|███▍      | 162/480 [33:48<46:32,  8.78s/it]
 34%|███▍      | 163/480 [33:57<46:22,  8.78s/it]
 34%|███▍      | 164/480 [34:04<42:49,  8.13s/it]
 34%|███▍      | 165/480 [34:17<50:08,  9.55s/it]
 35%|███▍      | 166/480 [34:21<42:44,  8.17s/it]
 35%|███▍      | 167/480 [34:39<57:54, 11.10s/it]
 35%|███▌      | 168/480 [34:53<1:00:57, 11.72s/it]
 35%|███▌      | 169/480 [34:58<50:30,  9.74s/it]  
 35%|███▌      | 170/480 [35:04<44:44,  8.66s/it]
 36%|███▌      | 171/480 [36:06<2:06:42, 24.60s/it]
 36%|███▌      | 172/480 [36:11<1:36:39, 18.83s/it]
 36%|███▌      | 173/480 [36:15<1:13:29, 14.36s/it]
 36%|███▋      | 174/480 [36:20<58:45, 11.52s/it]  
 36%|███▋      | 175/480 [36:25<49:31,  9.74s/it]
 37%|███▋      | 176/480 [36:30<42:10,  8.32s/it]
 37%|███▋      | 177/480 [36:46<53:39, 10.62s/it]
 37%|███▋      | 178/480 [37:32<1:45:40, 20.99s/it]
 37%|███▋      | 179/480 [37:40<1:26:35, 17.26s/it]
 38%|███▊      | 180/480 [37:45<1:07:42, 13.54s/it]
 38%|███▊      | 181/480 [37:50<54:08, 10.87s/it]  
 38%|███▊      | 182/480 [37:57<49:22,  9.94s/it]
 38%|███▊      | 183/480 [38:03<42:08,  8.52s/it]
 38%|███▊      | 184/480 [38:11<42:00,  8.52s/it]
 39%|███▊      | 185/480 [38:17<37:46,  7.68s/it]
 39%|███▉      | 186/480 [38:35<52:18, 10.67s/it]
 39%|███▉      | 187/480 [38:47<54:54, 11.24s/it]
 39%|███▉      | 188/480 [38:56<51:46, 10.64s/it]
 39%|███▉      | 189/480 [39:14<1:01:39, 12.71s/it]
 40%|███▉      | 190/480 [39:21<52:49, 10.93s/it]  
 40%|███▉      | 191/480 [39:26<45:01,  9.35s/it]
 40%|████      | 192/480 [39:31<38:41,  8.06s/it]
 40%|████      | 193/480 [39:49<51:53, 10.85s/it]
 40%|████      | 194/480 [39:54<44:07,  9.26s/it]
 41%|████      | 195/480 [40:02<41:04,  8.65s/it]
 41%|████      | 196/480 [40:10<40:57,  8.65s/it]
 41%|████      | 197/480 [40:18<39:02,  8.28s/it]
 41%|████▏     | 198/480 [40:21<32:11,  6.85s/it]
 41%|████▏     | 199/480 [40:42<52:22, 11.18s/it]
 42%|████▏     | 200/480 [40:52<49:54, 10.70s/it]
 42%|████▏     | 201/480 [41:10<59:26, 12.78s/it]
 42%|████▏     | 202/480 [41:23<1:00:32, 13.07s/it]
 42%|████▏     | 203/480 [41:29<49:44, 10.78s/it]  
 42%|████▎     | 204/480 [41:35<42:42,  9.29s/it]
 43%|████▎     | 205/480 [41:41<38:31,  8.40s/it]
 43%|████▎     | 206/480 [41:50<39:14,  8.59s/it]
 43%|████▎     | 207/480 [42:17<1:04:54, 14.27s/it]
 43%|████▎     | 208/480 [42:32<1:05:09, 14.37s/it]
 44%|████▎     | 209/480 [42:36<50:17, 11.14s/it]  
 44%|████▍     | 210/480 [42:42<43:21,  9.64s/it]
 44%|████▍     | 211/480 [42:46<35:44,  7.97s/it]
 44%|████▍     | 212/480 [42:56<38:25,  8.60s/it]
 44%|████▍     | 213/480 [43:06<39:41,  8.92s/it]
 45%|████▍     | 214/480 [43:16<41:18,  9.32s/it]
 45%|████▍     | 215/480 [43:21<35:29,  8.03s/it]
 45%|████▌     | 216/480 [43:30<37:02,  8.42s/it]
 45%|████▌     | 217/480 [43:48<49:46, 11.36s/it]
 45%|████▌     | 218/480 [44:41<1:43:54, 23.80s/it]
 46%|████▌     | 219/480 [45:00<1:36:17, 22.14s/it]
 46%|████▌     | 220/480 [45:04<1:12:49, 16.81s/it]
 46%|████▌     | 221/480 [45:19<1:10:32, 16.34s/it]
 46%|████▋     | 222/480 [45:28<1:00:12, 14.00s/it]
 46%|████▋     | 223/480 [45:35<51:09, 11.94s/it]  
 47%|████▋     | 224/480 [45:56<1:02:41, 14.69s/it]
 47%|████▋     | 225/480 [46:05<55:17, 13.01s/it]  
 47%|████▋     | 226/480 [46:13<48:13, 11.39s/it]
 47%|████▋     | 227/480 [46:50<1:21:14, 19.27s/it]
 48%|████▊     | 228/480 [46:59<1:07:13, 16.01s/it]
 48%|████▊     | 229/480 [47:03<52:31, 12.55s/it]  
 48%|████▊     | 230/480 [47:36<1:17:46, 18.67s/it]
 48%|████▊     | 231/480 [47:45<1:05:48, 15.86s/it]
 48%|████▊     | 232/480 [48:02<1:06:01, 15.97s/it]
 49%|████▊     | 233/480 [48:11<57:24, 13.95s/it]  
 49%|████▉     | 234/480 [48:35<1:09:50, 17.04s/it]
 49%|████▉     | 235/480 [48:41<55:39, 13.63s/it]  
 49%|████▉     | 236/480 [49:06<1:10:03, 17.23s/it]
 49%|████▉     | 237/480 [49:12<55:50, 13.79s/it]  
 50%|████▉     | 238/480 [49:17<45:07, 11.19s/it]
 50%|████▉     | 239/480 [50:03<1:25:59, 21.41s/it]
 50%|█████     | 240/480 [50:15<1:15:20, 18.84s/it]
 50%|█████     | 241/480 [50:22<1:00:38, 15.22s/it]
 50%|█████     | 242/480 [50:39<1:02:41, 15.80s/it]
 51%|█████     | 243/480 [50:48<53:50, 13.63s/it]  
 51%|█████     | 244/480 [50:54<44:19, 11.27s/it]
 51%|█████     | 245/480 [51:01<39:34, 10.10s/it]
 51%|█████▏    | 246/480 [51:10<38:10,  9.79s/it]
 51%|█████▏    | 247/480 [51:19<36:51,  9.49s/it]
 52%|█████▏    | 248/480 [51:26<34:00,  8.79s/it]
 52%|█████▏    | 249/480 [51:34<33:01,  8.58s/it]
 52%|█████▏    | 250/480 [51:47<38:16,  9.98s/it]
 52%|█████▏    | 251/480 [52:24<1:09:00, 18.08s/it]
 52%|█████▎    | 252/480 [52:32<56:16, 14.81s/it]  
 53%|█████▎    | 253/480 [52:45<54:35, 14.43s/it]
 53%|█████▎    | 254/480 [52:56<50:48, 13.49s/it]
 53%|█████▎    | 255/480 [53:03<43:14, 11.53s/it]
 53%|█████▎    | 256/480 [53:11<38:46, 10.39s/it]
 54%|█████▎    | 257/480 [53:17<33:48,  9.10s/it]
 54%|█████▍    | 258/480 [53:21<27:45,  7.50s/it]
 54%|█████▍    | 259/480 [53:36<36:13,  9.83s/it]
 54%|█████▍    | 260/480 [53:41<30:21,  8.28s/it]
 54%|█████▍    | 261/480 [53:53<34:57,  9.58s/it]
 55%|█████▍    | 262/480 [54:02<33:44,  9.29s/it]
 55%|█████▍    | 263/480 [54:08<30:01,  8.30s/it]
 55%|█████▌    | 264/480 [54:15<27:59,  7.78s/it]
 55%|█████▌    | 265/480 [54:18<23:39,  6.60s/it]
 55%|█████▌    | 266/480 [54:25<23:36,  6.62s/it]
 56%|█████▌    | 267/480 [55:08<1:02:17, 17.55s/it]
 56%|█████▌    | 268/480 [55:15<50:58, 14.43s/it]  
 56%|█████▌    | 269/480 [55:39<1:00:02, 17.07s/it]
 56%|█████▋    | 270/480 [56:06<1:10:19, 20.09s/it]
 56%|█████▋    | 271/480 [56:21<1:05:00, 18.66s/it]
 57%|█████▋    | 272/480 [56:26<50:34, 14.59s/it]  
 57%|█████▋    | 273/480 [56:30<39:36, 11.48s/it]
 57%|█████▋    | 274/480 [56:38<35:18, 10.28s/it]
 57%|█████▋    | 275/480 [57:25<1:12:56, 21.35s/it]
 57%|█████▊    | 276/480 [57:41<1:06:44, 19.63s/it]
 58%|█████▊    | 277/480 [57:48<54:13, 16.03s/it]  
 58%|█████▊    | 278/480 [57:53<42:43, 12.69s/it]
 58%|█████▊    | 279/480 [58:10<46:52, 13.99s/it]
 58%|█████▊    | 280/480 [58:17<39:54, 11.97s/it]
 59%|█████▊    | 281/480 [58:25<35:46, 10.78s/it]
 59%|█████▉    | 282/480 [58:31<30:06,  9.12s/it]
 59%|█████▉    | 283/480 [58:35<24:47,  7.55s/it]
 59%|█████▉    | 284/480 [58:39<21:45,  6.66s/it]
 59%|█████▉    | 285/480 [58:50<25:42,  7.91s/it]
 60%|█████▉    | 286/480 [58:56<23:44,  7.34s/it]
 60%|█████▉    | 287/480 [59:02<22:10,  6.89s/it]
 60%|██████    | 288/480 [59:18<30:59,  9.69s/it]
 60%|██████    | 289/480 [59:23<26:45,  8.41s/it]
 60%|██████    | 290/480 [59:28<23:21,  7.38s/it]
 61%|██████    | 291/480 [59:33<20:34,  6.53s/it]
 61%|██████    | 292/480 [59:46<26:19,  8.40s/it]
 61%|██████    | 293/480 [1:00:04<35:14, 11.31s/it]
 61%|██████▏   | 294/480 [1:00:08<28:07,  9.07s/it]
 61%|██████▏   | 295/480 [1:00:14<25:16,  8.20s/it]
 62%|██████▏   | 296/480 [1:00:28<30:26,  9.93s/it]
 62%|██████▏   | 297/480 [1:00:35<27:58,  9.17s/it]
 62%|██████▏   | 298/480 [1:00:51<33:21, 11.00s/it]
 62%|██████▏   | 299/480 [1:01:20<50:10, 16.63s/it]
 62%|██████▎   | 300/480 [1:01:24<38:18, 12.77s/it]
 63%|██████▎   | 301/480 [1:01:33<34:21, 11.52s/it]
 63%|██████▎   | 302/480 [1:01:57<45:12, 15.24s/it]
 63%|██████▎   | 303/480 [1:02:05<38:32, 13.07s/it]
 63%|██████▎   | 304/480 [1:02:08<30:06, 10.27s/it]
 64%|██████▎   | 305/480 [1:02:25<35:35, 12.20s/it]
 64%|██████▍   | 306/480 [1:02:32<31:10, 10.75s/it]
 64%|██████▍   | 307/480 [1:02:40<28:11,  9.78s/it]
 64%|██████▍   | 308/480 [1:02:45<23:45,  8.29s/it]
 64%|██████▍   | 309/480 [1:02:49<20:20,  7.14s/it]
 65%|██████▍   | 310/480 [1:02:53<17:26,  6.16s/it]
 65%|██████▍   | 311/480 [1:02:59<17:01,  6.04s/it]
 65%|██████▌   | 312/480 [1:03:16<26:04,  9.31s/it]
 65%|██████▌   | 313/480 [1:03:23<24:35,  8.83s/it]
 65%|██████▌   | 314/480 [1:03:38<28:54, 10.45s/it]
 66%|██████▌   | 315/480 [1:04:20<55:23, 20.14s/it]
 66%|██████▌   | 316/480 [1:04:28<44:55, 16.43s/it]
 66%|██████▌   | 317/480 [1:04:50<49:10, 18.10s/it]
 66%|██████▋   | 318/480 [1:04:57<39:20, 14.57s/it]
 66%|██████▋   | 319/480 [1:05:15<41:56, 15.63s/it]
 67%|██████▋   | 320/480 [1:05:22<35:11, 13.20s/it]
 67%|██████▋   | 321/480 [1:05:27<28:15, 10.66s/it]
 67%|██████▋   | 322/480 [1:05:32<23:39,  8.99s/it]
 67%|██████▋   | 323/480 [1:05:38<21:05,  8.06s/it]
 68%|██████▊   | 324/480 [1:05:45<20:07,  7.74s/it]
 68%|██████▊   | 325/480 [1:05:49<17:34,  6.80s/it]
 68%|██████▊   | 326/480 [1:05:55<16:28,  6.42s/it]
 68%|██████▊   | 327/480 [1:06:03<17:24,  6.83s/it]
 68%|██████▊   | 328/480 [1:06:08<15:51,  6.26s/it]
 69%|██████▊   | 329/480 [1:07:06<55:01, 21.86s/it]
 69%|██████▉   | 330/480 [1:07:17<46:32, 18.62s/it]
 69%|██████▉   | 331/480 [1:07:21<35:38, 14.35s/it]
 69%|██████▉   | 332/480 [1:07:29<30:42, 12.45s/it]
 69%|██████▉   | 333/480 [1:07:40<28:58, 11.83s/it]
 70%|██████▉   | 334/480 [1:07:59<33:49, 13.90s/it]
 70%|██████▉   | 335/480 [1:08:10<31:52, 13.19s/it]
 70%|███████   | 336/480 [1:08:19<28:30, 11.88s/it]
 70%|███████   | 337/480 [1:08:30<27:48, 11.67s/it]
 70%|███████   | 338/480 [1:08:52<35:08, 14.85s/it]
 71%|███████   | 339/480 [1:09:00<29:38, 12.61s/it]
 71%|███████   | 340/480 [1:09:06<24:37, 10.55s/it]
 71%|███████   | 341/480 [1:09:17<24:53, 10.74s/it]
 71%|███████▏  | 342/480 [1:09:23<21:48,  9.48s/it]
 71%|███████▏  | 343/480 [1:09:35<23:10, 10.15s/it]
 72%|███████▏  | 344/480 [1:09:39<18:39,  8.23s/it]
 72%|███████▏  | 345/480 [1:09:48<19:15,  8.56s/it]
 72%|███████▏  | 346/480 [1:10:00<21:04,  9.44s/it]
 72%|███████▏  | 347/480 [1:10:03<17:07,  7.73s/it]
 72%|███████▎  | 348/480 [1:10:35<32:35, 14.81s/it]
 73%|███████▎  | 349/480 [1:10:42<27:34, 12.63s/it]
 73%|███████▎  | 350/480 [1:10:46<21:45, 10.04s/it]
 73%|███████▎  | 351/480 [1:10:51<18:03,  8.40s/it]
 73%|███████▎  | 352/480 [1:10:58<17:22,  8.15s/it]
 74%|███████▎  | 353/480 [1:11:06<17:13,  8.14s/it]
 74%|███████▍  | 354/480 [1:11:33<28:55, 13.77s/it]
 74%|███████▍  | 355/480 [1:11:44<26:30, 12.73s/it]
 74%|███████▍  | 356/480 [1:11:54<24:52, 12.04s/it]
 74%|███████▍  | 357/480 [1:12:02<22:22, 10.92s/it]
 75%|███████▍  | 358/480 [1:12:10<20:27, 10.06s/it]
 75%|███████▍  | 359/480 [1:12:16<17:25,  8.64s/it]
 75%|███████▌  | 360/480 [1:12:28<19:43,  9.86s/it]
 75%|███████▌  | 361/480 [1:12:37<18:50,  9.50s/it]
 75%|███████▌  | 362/480 [1:12:42<15:45,  8.01s/it]
 76%|███████▌  | 363/480 [1:12:46<13:31,  6.93s/it]
 76%|███████▌  | 364/480 [1:13:00<17:37,  9.12s/it]
 76%|███████▌  | 365/480 [1:13:29<28:56, 15.10s/it]
 76%|███████▋  | 366/480 [1:14:03<39:33, 20.82s/it]
 76%|███████▋  | 367/480 [1:14:08<30:06, 15.99s/it]
 77%|███████▋  | 368/480 [1:14:28<32:13, 17.26s/it]
 77%|███████▋  | 369/480 [1:14:39<28:29, 15.40s/it]
 77%|███████▋  | 370/480 [1:14:46<23:27, 12.80s/it]
 77%|███████▋  | 371/480 [1:15:01<24:05, 13.26s/it]
 78%|███████▊  | 372/480 [1:15:10<21:51, 12.14s/it]
 78%|███████▊  | 373/480 [1:15:29<25:32, 14.32s/it]
 78%|███████▊  | 374/480 [1:15:38<22:29, 12.73s/it]
 78%|███████▊  | 375/480 [1:15:45<19:07, 10.93s/it]
 78%|███████▊  | 376/480 [1:15:51<16:25,  9.48s/it]
 79%|███████▊  | 377/480 [1:15:58<15:01,  8.75s/it]
 79%|███████▉  | 378/480 [1:16:21<21:56, 12.91s/it]
 79%|███████▉  | 379/480 [1:16:38<23:57, 14.24s/it]
 79%|███████▉  | 380/480 [1:16:45<19:54, 11.95s/it]
 79%|███████▉  | 381/480 [1:16:52<17:09, 10.40s/it]
 80%|███████▉  | 382/480 [1:16:56<13:48,  8.45s/it]
 80%|███████▉  | 383/480 [1:17:08<15:37,  9.66s/it]
 80%|████████  | 384/480 [1:17:28<20:26, 12.78s/it]
 80%|████████  | 385/480 [1:17:36<17:47, 11.23s/it]
 80%|████████  | 386/480 [1:17:42<15:06,  9.64s/it]
 81%|████████  | 387/480 [1:17:47<12:44,  8.22s/it]
 81%|████████  | 388/480 [1:17:51<10:52,  7.10s/it]
 81%|████████  | 389/480 [1:18:04<13:22,  8.82s/it]
 81%|████████▏ | 390/480 [1:18:13<13:32,  9.03s/it]
 81%|████████▏ | 391/480 [1:18:17<11:09,  7.53s/it]
 82%|████████▏ | 392/480 [1:18:26<11:30,  7.84s/it]
 82%|████████▏ | 393/480 [1:18:54<20:05, 13.85s/it]
 82%|████████▏ | 394/480 [1:19:03<17:52, 12.47s/it]
 82%|████████▏ | 395/480 [1:19:22<20:29, 14.47s/it]
 82%|████████▎ | 396/480 [1:19:35<19:22, 13.84s/it]
 83%|████████▎ | 397/480 [1:19:39<15:23, 11.13s/it]
 83%|████████▎ | 398/480 [1:19:49<14:28, 10.59s/it]
 83%|████████▎ | 399/480 [1:19:57<13:26,  9.96s/it]
 83%|████████▎ | 400/480 [1:20:20<18:12, 13.65s/it]
 84%|████████▎ | 401/480 [1:20:35<18:36, 14.13s/it]
 84%|████████▍ | 402/480 [1:20:39<14:40, 11.29s/it]
 84%|████████▍ | 403/480 [1:20:45<12:13,  9.52s/it]
 84%|████████▍ | 404/480 [1:21:05<15:58, 12.61s/it]
 84%|████████▍ | 405/480 [1:21:13<14:17, 11.43s/it]
 85%|████████▍ | 406/480 [1:21:18<11:24,  9.25s/it]
 85%|████████▍ | 407/480 [1:21:22<09:26,  7.76s/it]
 85%|████████▌ | 408/480 [1:21:53<17:49, 14.86s/it]
 85%|████████▌ | 409/480 [1:21:58<14:07, 11.94s/it]
 85%|████████▌ | 410/480 [1:22:07<12:47, 10.97s/it]
 86%|████████▌ | 411/480 [1:22:14<11:15,  9.79s/it]
 86%|████████▌ | 412/480 [1:22:18<09:00,  7.96s/it]
 86%|████████▌ | 413/480 [1:22:31<10:48,  9.68s/it]
 86%|████████▋ | 414/480 [1:22:38<09:44,  8.85s/it]
 86%|████████▋ | 415/480 [1:22:57<12:44, 11.75s/it]
 87%|████████▋ | 416/480 [1:23:06<11:42, 10.98s/it]
 87%|████████▋ | 417/480 [1:23:17<11:21, 10.82s/it]
 87%|████████▋ | 418/480 [1:23:21<09:10,  8.88s/it]
 87%|████████▋ | 419/480 [1:23:30<09:12,  9.05s/it]
 88%|████████▊ | 420/480 [1:23:55<13:46, 13.78s/it]
 88%|████████▊ | 421/480 [1:24:02<11:27, 11.66s/it]
 88%|████████▊ | 422/480 [1:24:13<11:09, 11.54s/it]
 88%|████████▊ | 423/480 [1:24:31<12:49, 13.50s/it]
 88%|████████▊ | 424/480 [1:24:35<09:46, 10.47s/it]
 89%|████████▊ | 425/480 [1:24:38<07:36,  8.30s/it]
 89%|████████▉ | 426/480 [1:24:43<06:37,  7.36s/it]
 89%|████████▉ | 427/480 [1:24:55<07:43,  8.74s/it]
 89%|████████▉ | 428/480 [1:25:00<06:31,  7.53s/it]
 89%|████████▉ | 429/480 [1:25:10<07:02,  8.28s/it]
 90%|████████▉ | 430/480 [1:25:24<08:27, 10.16s/it]
 90%|████████▉ | 431/480 [1:25:36<08:48, 10.78s/it]
 90%|█████████ | 432/480 [1:25:44<07:48,  9.76s/it]
 90%|█████████ | 433/480 [1:25:50<06:46,  8.64s/it]
 90%|█████████ | 434/480 [1:26:06<08:24, 10.97s/it]
 91%|█████████ | 435/480 [1:26:29<10:46, 14.37s/it]
 91%|█████████ | 436/480 [1:26:36<08:59, 12.25s/it]
 91%|█████████ | 437/480 [1:26:44<07:57, 11.10s/it]
 91%|█████████▏| 438/480 [1:26:53<07:10, 10.24s/it]
 91%|█████████▏| 439/480 [1:26:58<06:07,  8.96s/it]
 92%|█████████▏| 440/480 [1:27:05<05:32,  8.31s/it]
 92%|█████████▏| 441/480 [1:27:11<04:48,  7.39s/it]
 92%|█████████▏| 442/480 [1:27:56<11:52, 18.74s/it]
 92%|█████████▏| 443/480 [1:28:00<08:53, 14.43s/it]
 92%|█████████▎| 444/480 [1:28:05<06:58, 11.62s/it]
 93%|█████████▎| 445/480 [1:28:27<08:30, 14.59s/it]
 93%|█████████▎| 446/480 [1:28:30<06:25, 11.33s/it]
 93%|█████████▎| 447/480 [1:28:37<05:26,  9.90s/it]
 93%|█████████▎| 448/480 [1:28:45<05:03,  9.47s/it]
 94%|█████████▎| 449/480 [1:28:56<05:00,  9.69s/it]
 94%|█████████▍| 450/480 [1:29:01<04:07,  8.23s/it]
 94%|█████████▍| 451/480 [1:29:07<03:46,  7.80s/it]
 94%|█████████▍| 452/480 [1:29:12<03:14,  6.95s/it]
 94%|█████████▍| 453/480 [1:29:19<03:07,  6.95s/it]
 95%|█████████▍| 454/480 [1:29:24<02:43,  6.30s/it]
 95%|█████████▍| 455/480 [1:29:34<03:06,  7.45s/it]
 95%|█████████▌| 456/480 [1:29:38<02:34,  6.44s/it]
 95%|█████████▌| 457/480 [1:29:43<02:13,  5.83s/it]
 95%|█████████▌| 458/480 [1:29:50<02:20,  6.37s/it]
 96%|█████████▌| 459/480 [1:30:08<03:26,  9.84s/it]
 96%|█████████▌| 460/480 [1:30:26<04:06, 12.34s/it]
 96%|█████████▌| 461/480 [1:30:31<03:10, 10.04s/it]
 96%|█████████▋| 462/480 [1:30:35<02:27,  8.17s/it]
 96%|█████████▋| 463/480 [1:30:41<02:10,  7.70s/it]
 97%|█████████▋| 464/480 [1:30:51<02:09,  8.12s/it]
 97%|█████████▋| 465/480 [1:31:24<03:55, 15.70s/it]
 97%|█████████▋| 466/480 [1:31:30<02:57, 12.71s/it]
 97%|█████████▋| 467/480 [1:31:36<02:21, 10.89s/it]
 98%|█████████▊| 468/480 [1:31:40<01:44,  8.68s/it]
 98%|█████████▊| 469/480 [1:32:11<02:49, 15.44s/it]
 98%|█████████▊| 470/480 [1:33:07<04:36, 27.62s/it]
 98%|█████████▊| 471/480 [1:33:22<03:34, 23.84s/it]
 98%|█████████▊| 472/480 [1:33:26<02:22, 17.83s/it]
 99%|█████████▊| 473/480 [1:33:31<01:37, 13.98s/it]
 99%|█████████▉| 474/480 [1:33:40<01:15, 12.55s/it]
 99%|█████████▉| 475/480 [1:33:45<00:51, 10.24s/it]
 99%|█████████▉| 476/480 [1:34:03<00:49, 12.47s/it]
 99%|█████████▉| 477/480 [1:34:20<00:42, 14.04s/it]
100%|█████████▉| 478/480 [1:34:33<00:27, 13.57s/it]
100%|█████████▉| 479/480 [1:34:39<00:11, 11.32s/it]
100%|██████████| 480/480 [1:34:48<00:00, 11.85s/it]
In [ ]:
 
In [36]:
scrutinizer.details
Out[36]:
experiment_name        time_series__gru_hpo_1
random_method                uniform_mersenne
reduction_method                         None
reduction_interval                         50
reduction_window                           20
reduction_threshold                       0.2
reduction_metric                     val_loss
complete_time                  07/17/20/09:21
x_shape                         (2358, 1, 36)
y_shape                               (2358,)
dtype: object
In [37]:
analyze_object = talos.Analyze(scrutinizer)
analyze_object.data
Out[37]:
start end duration round_epochs loss mean_absolute_error r_squared val_loss val_mean_absolute_error val_r_squared activation1 activation2 batch_size dropout epochs loss neurons1 neurons2 optimizer
0 07/17/20-074702 07/17/20-074709 7.263751 18 1.050157 0.738348 0.945706 0.237421 0.401475 0.985582 None None 64 0.3 46 mse 64 256 Adam
1 07/17/20-074710 07/17/20-074716 6.009053 34 0.915937 0.668445 0.953623 0.163706 0.325042 0.990206 None None 128 0.1 46 mse 64 128 Adam
2 07/17/20-074716 07/17/20-074733 17.203734 12 1.053062 1.053062 0.865736 0.348090 0.348090 0.982904 <function relu at 0x7fba53cba7a0> <function linear at 0x7fba53cbaa70> 8 0.4 72 mae 128 128 Adam
3 07/17/20-074733 07/17/20-074755 21.905334 16 0.872617 0.872617 0.906110 0.600920 0.600920 0.955830 None None 8 0.4 124 mae 128 128 Adam
4 07/17/20-074755 07/17/20-074803 7.979588 24 0.752482 0.752482 0.944436 0.353825 0.353825 0.988342 <function relu at 0x7fba53cba7a0> <function linear at 0x7fba53cbaa70> 64 0.2 98 mae 32 256 Adam
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
475 07/17/20-092048 07/17/20-092105 17.438523 18 3.157768 1.326822 0.791838 0.238766 0.394963 0.979517 <function relu at 0x7fba53cba7a0> None 8 0.4 124 mse 64 64 Adam
476 07/17/20-092105 07/17/20-092123 17.465008 10 1.547810 0.946998 0.911677 0.235218 0.401160 0.984239 None <function linear at 0x7fba53cbaa70> 16 0.4 46 mse 256 128 RMSprop
477 07/17/20-092123 07/17/20-092135 12.253486 12 1.295143 0.848772 0.913346 0.295154 0.452715 0.979096 None None 8 0.2 72 mse 64 64 RMSprop
478 07/17/20-092136 07/17/20-092141 5.829684 14 1.293972 0.850210 0.931106 0.138816 0.292547 0.991602 <function relu at 0x7fba53cba7a0> <function linear at 0x7fba53cbaa70> 32 0.1 46 mse 64 64 Adam
479 07/17/20-092142 07/17/20-092150 8.683155 31 3.747087 1.450994 0.797253 0.936066 0.906809 0.941654 <function relu at 0x7fba53cba7a0> <function linear at 0x7fba53cbaa70> 32 0.4 124 mse 32 32 Adam

480 rows × 19 columns

In [ ]:
 
In [38]:
# The lowest root mean squared error achieved on validation set 
best_r_squared = analyze_object.high('val_r_squared')
# The lowest mean absolute error achieved on validation set 
best_mae = analyze_object.low('val_mean_absolute_error')

print("The scores for rmse and mae, on validation set are %.5f and %.5f, respectively."%(best_r_squared, best_mae))
The scores for rmse and mae, on validation set are 0.99301 and 0.26511, respectively.
In [ ]:
 
In [43]:
# The best models based on respective metrics
best1 = scrutinizer.best_model(metric='val_r_squared', asc=False)
best2 = scrutinizer.best_model(metric='val_mean_absolute_error', asc=False)

# Predicting the Test set results
y_pred_rsq = best1.predict(x_test)
y_pred_mae = best2.predict(x_test)
r2_1 = r2_score(y_test, y_pred_rsq)
r2_2 = r2_score(y_test, y_pred_mae)
print('R-squared for the mae-based metric, on test set, is %.5f .'%(r2_2))

#
y0 = y_test.flatten()
y1 = y_pred_rsq.flatten()
y2 = y_pred_mae.flatten()
results = pd.DataFrame({"y_test":y0, "y_pred_rsq":y1, "y_pred_mae":y2})

results.tail(30)
R-squared for the mae-based metric, on test set, is 0.75037 .
Out[43]:
y_test y_pred_rsq y_pred_mae
413 15.003 15.567156 12.617137
414 14.742 14.939928 12.086085
415 13.154 13.048302 10.359253
416 10.256 10.165793 7.896068
417 7.424 6.963971 5.091417
418 4.724 4.396383 2.958516
419 3.732 3.247497 2.469219
420 3.500 3.903115 2.693677
421 6.378 6.008095 4.434960
422 9.589 9.156446 7.150556
423 12.582 12.231606 9.771287
424 14.335 14.631231 11.790261
425 14.873 15.546347 12.603047
426 14.875 14.953358 12.045080
427 13.091 13.006264 10.332352
428 10.330 10.137802 7.873150
429 6.713 7.008280 5.101639
430 4.850 4.424886 2.983503
431 3.881 3.289584 2.484737
432 4.664 3.972551 2.740891
433 6.740 6.244617 4.587365
434 9.313 9.345440 7.254114
435 12.312 12.292873 9.798648
436 14.505 14.504245 11.681917
437 15.051 15.453953 12.481429
438 14.755 14.892052 11.961114
439 12.999 13.007525 10.297707
440 10.801 10.160855 7.874524
441 7.433 7.112982 5.168779
442 5.518 4.634850 3.133771
In [ ]:
 
In [40]:
xpoints = df.iloc[-443:, 0]

trace0 = go.Scatter(x=xpoints, y=y0, name='Actual Values')
trace1 = go.Scatter(x=xpoints, y=y1, name='Predicted (RSQ)')
trace2 = go.Scatter(x=xpoints, y=y2, name='Predicted (MAE)')

data = [trace0, trace1, trace2]

layout=go.Layout(title="Actual and Predicted Temperatures", xaxis={'title':'Year'}, yaxis={'title':'Temprature'})
fig = go.Figure(data=data, layout=layout)
fig.show()
In [41]:
# We can now save the best model for further use or deployment

# Get the best model index based on the highest 'validation ROC_AUC' 
model_id = analyze_object.data[['val_r_squared']].idxmax()[0]

# Clear any previous TensorFlow session.
tf.keras.backend.clear_session()

# Load the model parameters from the scanner.
model = model_from_json(scrutinizer.saved_models[model_id])
model.set_weights(scrutinizer.saved_weights[model_id])
model.summary()
model.save('./avg_land_temp_best_model.h5')
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
bidirectional (Bidirectional multiple                  38784     
_________________________________________________________________
gru_1 (GRU)                  multiple                  295680    
_________________________________________________________________
dropout (Dropout)            multiple                  0         
_________________________________________________________________
dense (Dense)                multiple                  257       
=================================================================
Total params: 334,721
Trainable params: 334,721
Non-trainable params: 0
_________________________________________________________________

Conclusion:


As can be seen, a GRU achieves a mind-blowing Coefficiencient of Determination (r_squared) of 99% on completely new data (test set). The curve (red line), of the predicted values perfectly mimics the actual values curve (blue line), almost completely shading it.
In [ ]: